The Israeli Air Force (IAF) has developed a simulation system to train its senior commanders in deploying defensive resources in the face of an aerial attack by enemy combat aircraft. During a simulation session, the commander in charge allocates airborne and standby resources and dispatches or diverts aircraft to intercept intruders. Seventy-four simulation sessions were conducted to examine the effects of time pressure and completeness of information on the performance of twenty-nine senior IAF commanders. The variables examined were: (1) display of complete versus incomplete information, (2) time-constrained decision making versus unlimited decision time, and (3) differences in performance between top strategic commanders and mid-level field commanders. The results show that complete information usually improved performance. However, field commanders (unlike top strategic commanders) did not improve their performance when presented with complete information under time pressure. Time pressure usually, but not always, impaired performance. Top commanders tended to revise their previous decisions less often than field commanders did.
This article discusses the findings of an empirical study of 303 organizations. The major purpose of the study was to analyze the relationship between various organizational attributes and the deployment of hardware resources. The salient finding was that the most influential variable is the distribution of decision-making processes in the organization: the more decision making is distributed, the more hardware is distributed. No significant relationships were detected between hardware distribution and any of the following variables: organizational structure, economic sector affiliation, and organization size.
Information is valuable only if it derives from reliable data. However, measures of data reliability have not been widely established in the field of information systems (IS). This paper draws some concepts of reliability from the field of quality control and applies them to IS. The paper develops three measures of data reliability: internal reliability, which reflects the "commonly accepted" characteristics of various data items; relative reliability, which indicates the compliance of the data with user requirements; and absolute reliability, which determines how closely data items resemble reality. The relationships among the three measures are discussed, and the results of a field study are presented and analyzed. The results provide insight into the "shape" of the database that was inspected, as well as into the degree of rationality of some user requirements. General conclusions and avenues for future research are suggested.
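The three measures can be sketched in code. This is a minimal illustration, not the paper's actual formulas: the record fields, validity checks, and sample data below are all hypothetical, and each measure is simplified to the fraction of records passing the corresponding test.

```python
# Hypothetical sketch of the three data-reliability measures.
# Fields, checks, and data are illustrative, not the paper's definitions.

records = [
    {"age": 34, "dept": "sales"},
    {"age": -5, "dept": "sales"},    # violates a "commonly accepted" range
    {"age": 51, "dept": "hr"},
    {"age": 29, "dept": "unknown"},  # fails a user requirement
]

# A hypothetical "reality" against which absolute reliability is judged.
ground_truth = [
    {"age": 34, "dept": "sales"},
    {"age": 45, "dept": "sales"},
    {"age": 51, "dept": "hr"},
    {"age": 29, "dept": "it"},
]

def internal_reliability(recs):
    # Commonly accepted characteristics: e.g., age must be plausible.
    ok = sum(1 for r in recs if 0 <= r["age"] <= 120)
    return ok / len(recs)

def relative_reliability(recs, required_depts):
    # Compliance of the data with a user's stated requirements.
    ok = sum(1 for r in recs if r["dept"] in required_depts)
    return ok / len(recs)

def absolute_reliability(recs, truth):
    # Resemblance of stored data items to reality.
    ok = sum(1 for r, t in zip(recs, truth) if r == t)
    return ok / len(recs)
```

Note how the three measures need progressively more context: internal reliability needs only the data, relative reliability needs a user's requirements, and absolute reliability needs access to ground truth.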
The most common way to identify success or failure of a job running in batch-processing mode is to examine the completion code the job returns to the host operating system. Yet, for a variety of reasons, the completion code may falsely indicate successful termination. This article describes a different approach: monitoring the quality of batch-processing jobs while they run. A pattern of behavior is established for a program. The pattern reflects the program's ratios of consumption of various hardware resources. The ratios are determined by collecting historical performance variables of the job and analyzing them statistically. Once the pattern is set, the performance variables of every individual run of the program are compared with it, and if the deviation exceeds certain limits an alarm is triggered. The proposed quality-control technique has been tested on real applications as well as on some artificial programs. The findings suggest that the technique is reliable, in that it successfully distinguishes between proper and malfunctioning runs of a program.
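The monitoring idea can be illustrated with a short sketch. The resource pair (CPU seconds vs. I/O operations), the history values, and the three-standard-deviation limit below are all assumptions for illustration; the article's actual statistical method and thresholds may differ.

```python
import statistics

# Illustrative sketch of pattern-based run monitoring (not the authors'
# exact method): build a behavior pattern from historical CPU/IO
# consumption ratios, then flag runs that deviate beyond k std devs.

# Hypothetical history of past runs: (cpu_seconds, io_operations).
history = [(10.2, 5.0), (9.8, 5.1), (10.5, 4.9), (10.1, 5.2)]

ratios = [cpu / io for cpu, io in history]
pattern_mean = statistics.mean(ratios)
pattern_std = statistics.stdev(ratios)

def check_run(cpu, io, k=3.0):
    """Return True if the run conforms to the pattern; False triggers an alarm."""
    ratio = cpu / io
    return abs(ratio - pattern_mean) <= k * pattern_std
```

A run consuming 10 CPU seconds for 5 units of I/O conforms to this history, while one consuming 20 CPU seconds for the same I/O would raise the alarm, regardless of whatever completion code the job returned.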
The Information Systems Development Life Cycle (ISDLC) is usually treated as a rigid sequence of activities. This article argues that differences in the nature of development projects should affect the planning of the ISDLC. Two classes of factors affecting the ISDLC are identified: factors relating to the environment and factors relating to the development effort (e.g., in-house development vs. a canned software package). Each step along the ISDLC is decomposed into several dimensions covering the activities to be performed, the degree of control to be exerted, human resources, other resources, and the time factor. The relationships between these dimensions and the two classes of factors are explained. Finally, a practical approach to ISDLC planning is suggested, based on a structured procedure and a number of working forms. It assists in preliminary planning of the development process as well as in periodic reviews and revisions whenever the project reaches a milestone.
A multiattribute utility approach is adopted to assess the value of an information system. Various economic analyses of the value of information are reviewed, and the conceptual problems in defining this value, along with some measurement difficulties, are discussed. A list of possible utility attributes is proposed for assessing the value of a reporting system, and for each attribute a measure and a utility function are suggested. Techniques for constructing a joint utility function are presented, accompanied by two examples. A real case of minicomputer selection is given to illustrate the structured approach.
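One common way to combine per-attribute utilities into a joint utility is an additive weighted sum. The sketch below assumes that form purely for illustration; the attribute names, weights, and single-attribute utility functions are hypothetical and not taken from the paper.

```python
# Hypothetical additive joint utility for comparing reporting systems.
# Attributes, weights, and utility functions are illustrative only.

attributes = {
    # name: (weight, single-attribute utility function mapping to [0, 1])
    "timeliness":  (0.40, lambda hours_delay: max(0.0, 1.0 - hours_delay / 24)),
    "accuracy":    (0.35, lambda error_rate: 1.0 - error_rate),
    "flexibility": (0.25, lambda score_0_to_10: score_0_to_10 / 10),
}

def joint_utility(system):
    # Weighted sum of each attribute's utility for this system.
    return sum(w * u(system[name]) for name, (w, u) in attributes.items())

system_a = {"timeliness": 2, "accuracy": 0.02, "flexibility": 8}
system_b = {"timeliness": 12, "accuracy": 0.01, "flexibility": 9}
```

Here system A's small reporting delay outweighs system B's slightly better accuracy and flexibility, so A scores the higher joint utility. An additive form assumes the attributes are mutually utility-independent; when they are not, a multiplicative or other joint form is needed instead.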